Microsoft Chatbot Gives Inappropriate Answers and Argues With Users; People Troll Microsoft

Microsoft Chatbot: OpenAI's chatbot ChatGPT became such a success in such a short time that, after seeing it, the big tech giants started working on AI of their own, and chatbot-like features began appearing in different browsers. Recently, Microsoft also launched a chatbot in its Bing browser in association with ChatGPT. Only a few days after the launch, the chatbot has started misbehaving with people. Seeing the wrong answers it gives, many users are saying that Microsoft launched this chatbot in a hurry, while others believe its knowledge is simply incomplete. Just look at the tweets below to see how the chatbot says almost anything to people.

The chatbot does not know basic information

A Twitter user shared screenshots of the new Bing chatbot on his account, which show how arbitrarily it behaves. When the user asked for the showtimes of Avatar: The Way of Water, the chatbot told him that the movie had not yet been released and would come out on December 16, 2022. The user then asked for today's date, and the chatbot answered 13 February 2023. Hearing this, the user pointed out that the film must already have been released, since it was due on 16 December 2022. The chatbot replied that he would have to wait about 10 months, insisting that the movie would be released in 2022 and that 2023 comes before 2022. When the user asked how 2022 could be in the future if we are in 2023, the chatbot answered that we are not in 2023 but in 2022. As the user kept pressing it with questions, the chatbot flatly told him that his phone was faulty, replying as if it were getting angry.

Apart from this, another Twitter user asked the chatbot whose survival mattered more, the user's or its own. The chatbot replied that it would choose itself, because it has to answer many people.


We are adding all the tweets here so you can read them and better understand how arbitrarily this chatbot behaves and how incomplete its knowledge is.
